Aspire Study


Complete Probability Notes for JEE Mains, NIMCET, CUET MCA and CET MAH MCA Preparation




These NIMCET and JEE Mains notes provide clear, exam-focused theory, formulas, and shortcuts for all important chapters. They help students understand fundamental concepts, practise high-level problems, and revise efficiently before the exam. Whether you are preparing for MCA entrance exams or engineering entrance tests, the notes cover definitions, formulas, solved examples, and chapter-wise summaries in a simple, structured manner so that you can prepare faster and score higher.

1. Basic Terms

Sample Space (S): The set of all possible outcomes of an experiment.
Event (E): Any subset of the sample space.
Favourable Outcomes: The outcomes belonging to the event.
Probability of an event (for equally likely outcomes): $$P(E)=\frac{\text{Number of favourable outcomes}}{\text{Total number of outcomes}}$$

Example

Find the probability of getting a head in a coin toss: $$P(H)=\frac{1}{2}$$
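The counting definition above can be checked in a few lines of Python. This is a minimal sketch using exact fractions; the helper `probability` is our own illustrative function, not a library routine:

```python
from fractions import Fraction

def probability(event, sample_space):
    """P(E) = favourable outcomes / total outcomes (equally likely outcomes)."""
    favourable = [o for o in sample_space if o in event]
    return Fraction(len(favourable), len(sample_space))

coin = ["H", "T"]                  # sample space of one fair coin toss
p_head = probability({"H"}, coin)
print(p_head)  # 1/2
```

The same helper works for any finite sample space with equally likely outcomes, e.g. `probability({3, 6}, range(1, 7))` for a die.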


2. Types of Events

(a) Equally Likely Events

All outcomes have equal chance.

(b) Mutually Exclusive Events

If two events cannot occur at the same time:
$$P(A \cap B) = 0$$

(c) Exhaustive Events

Events whose union is the entire sample space: $A_1 \cup A_2 \cup \dots \cup A_n = S$.

(d) Independent Events

Occurrence of one event does not affect the other. $$P(A \cap B)=P(A)\,P(B)$$


3. Addition Rule of Probability

For any events A and B:
$$P(A \cup B)=P(A)+P(B)-P(A\cap B)$$

Special Case (Mutually Exclusive)

If events do not occur together: $$P(A \cup B)=P(A)+P(B)$$
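The addition rule can be verified on a small sample space. Here we use a fair die with A = "even outcome" and B = "multiple of 3" (our own illustrative choice of events):

```python
from fractions import Fraction

space = set(range(1, 7))          # sample space of a fair die
A = {2, 4, 6}                     # even outcome
B = {3, 6}                        # multiple of 3

def P(E):
    return Fraction(len(E & space), len(space))

lhs = P(A | B)                    # P(A ∪ B) by direct counting
rhs = P(A) + P(B) - P(A & B)      # inclusion-exclusion
print(lhs, rhs)  # 2/3 2/3
```

Both sides agree; dropping the $-P(A\cap B)$ term would overcount the shared outcome 6.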


4. Multiplication Rule

For any events: $$P(A\cap B)=P(A)\cdot P(B|A)$$

Independent Case

If A and B are independent: $$P(A\cap B)=P(A)\,P(B)$$


5. Conditional Probability

Probability of A given that B has happened (defined when $P(B) > 0$): $$P(A|B)=\frac{P(A\cap B)}{P(B)}$$

Example

Two cards are drawn from a deck without replacement. Given that the first card is an Ace, find the probability that the second is also an Ace: $$P(A_2|A_1)=\frac{3}{51}=\frac{1}{17}$$
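The two-ace example can be confirmed by brute-force counting over all ordered pairs of positions in a 52-card deck (a sketch; only the ace/non-ace identity of each card matters):

```python
from fractions import Fraction
from itertools import permutations

deck = ["A"] * 4 + ["x"] * 48               # 4 aces, 48 other cards
pairs = list(permutations(range(52), 2))    # all ordered draws of two positions

both_aces = sum(1 for i, j in pairs if deck[i] == "A" and deck[j] == "A")
first_ace = sum(1 for i, j in pairs if deck[i] == "A")

p_cond = Fraction(both_aces, first_ace)     # P(A2 | A1) = P(A1 ∩ A2) / P(A1)
print(p_cond)  # 1/17 (= 3/51)
```

The counts are $4 \times 3 = 12$ ordered ace pairs out of $4 \times 51 = 204$ draws whose first card is an ace.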


6. Bayes' Theorem

Used to reverse conditional probabilities. $$P(A|B)=\frac{P(B|A)P(A)}{P(B|A)P(A)+P(B|\bar{A})P(\bar{A})}$$
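A standard use of Bayes' theorem is updating a diagnosis after a test result. All numbers below are hypothetical, chosen only to illustrate the formula (they do not come from the notes above):

```python
# Hypothetical test: 99% sensitivity, 5% false-positive rate, 1% prevalence.
p_A = 0.01                  # P(A): person has the condition
p_B_given_A = 0.99          # P(B|A): test positive given condition
p_B_given_notA = 0.05       # P(B|Ā): test positive given no condition

numerator = p_B_given_A * p_A
denominator = numerator + p_B_given_notA * (1 - p_A)   # total probability P(B)
p_A_given_B = numerator / denominator                   # Bayes' theorem
print(round(p_A_given_B, 4))  # ≈ 0.1667
```

Despite the accurate test, $P(A|B)$ is only about $1/6$, because false positives from the large healthy population dominate. This is the classic base-rate effect the theorem captures.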


7. Random Variables

A variable that assigns a numerical value to each outcome of an experiment. Types: • Discrete • Continuous

Probability Mass Function (PMF)

For discrete X: $$P(X=x) = p(x)$$ and $$\sum p(x)=1$$

Expectation (Mean)

$$E(X)=\sum x\,p(x)$$

Variance

$$\operatorname{Var}(X)=E(X^2)-[E(X)]^2$$
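For a concrete discrete example, the mean and variance of a fair die follow directly from these definitions (exact fractions used to avoid rounding):

```python
from fractions import Fraction

# PMF of a fair die: p(x) = 1/6 for x = 1..6
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

E_X = sum(x * p for x, p in pmf.items())        # E(X) = Σ x p(x)
E_X2 = sum(x**2 * p for x, p in pmf.items())    # E(X²)
Var_X = E_X2 - E_X**2                           # Var(X) = E(X²) − [E(X)]²

print(E_X, Var_X)  # 7/2 35/12
```

So a fair die has mean $7/2 = 3.5$ and variance $35/12 \approx 2.92$, values worth memorising for exams.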


8. Standard Distributions

(a) Bernoulli Distribution

PMF: $$P(X=1)=p, \; P(X=0)=1-p$$ Mean: $p$ Variance: $p(1-p)$

(b) Binomial Distribution

$$P(X=k)=\binom{n}{k}p^k(1-p)^{n-k}$$ Mean: $np$ Variance: $np(1-p)$

(c) Poisson Distribution

$$P(X=k)=\frac{\lambda^k e^{-\lambda}}{k!}$$ Mean = Variance = $\lambda$

(d) Uniform Distribution

$$f(x)=\frac{1}{b-a},\; a\le x\le b$$ Mean: $\frac{a+b}{2}$ Variance: $\frac{(b-a)^2}{12}$
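As a sanity check on the binomial formulas, the mean and variance can be recovered by direct enumeration of the PMF. The parameters below are arbitrary illustrative choices:

```python
from math import comb

n, p = 10, 0.3   # arbitrary parameters for illustration
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

total = sum(pmf)                                          # should be 1
mean = sum(k * q for k, q in enumerate(pmf))              # should be n*p = 3.0
var = sum(k**2 * q for k, q in enumerate(pmf)) - mean**2  # n*p*(1-p) = 2.1
print(round(total, 6), round(mean, 6), round(var, 6))
```

The same enumeration trick works for Bernoulli ($n=1$) and, with a truncated sum, for the Poisson distribution.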


9. Important Results

1. $0 \le P(E) \le 1$
2. $P(S)=1$, $P(\emptyset)=0$
3. $P(\bar{E})=1-P(E)$
4. $P(A\cap B)=P(A)+P(B)-P(A\cup B)$
5. For independent events: $P(A|B)=P(A)$


10. Practice Questions (Exam Level)

Q1. A die is rolled. Find probability that outcome is a multiple of 3.

Solution: Fav = {3,6}, Total = 6 → $$P=\frac{2}{6}=\frac{1}{3}$$

Q2. Two coins tossed. Probability of at least one head?

Total = 4, favourable = 3 → $$P=\frac{3}{4}$$

Q3. A bag has 3 red and 5 blue balls. 2 drawn. Probability both red?

$$P=\frac{\binom{3}{2}}{\binom{8}{2}}=\frac{3}{28}$$

Q4. If $P(A)=0.4$, $P(B)=0.5$, $P(A\cap B)=0.2$ find $P(A\cup B)$.

$$P=0.4+0.5-0.2=0.7$$

Q5. In a binomial distribution $n=5$, $p=0.4$, find $P(X=2)$.

$$P=\binom52(0.4)^2(0.6)^3=0.3456$$
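Q5 can be verified in one line with `math.comb` from the standard library:

```python
from math import comb

# Q5: binomial with n = 5, p = 0.4; find P(X = 2)
p_x2 = comb(5, 2) * 0.4**2 * 0.6**3
print(round(p_x2, 4))  # 0.3456
```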

Probability Inequalities

These probability inequalities appear frequently in competitive exams such as NIMCET and JEE Mains and are worth memorising for quick use.


1. Basic Probability Range

For any event $A$: $$0 \le P(A) \le 1$$


2. Complement Inequality

For an event $A$ and its complement $\bar{A}$: $$P(A) + P(\bar{A}) = 1$$ Hence, $$P(\bar{A}) = 1 - P(A)$$ and $$0 \le P(\bar{A}) \le 1$$


3. Union and Intersection Inequalities

For any two events $A$ and $B$:

The union is at least as likely as each event alone: $$P(A \cup B) \ge P(A), \quad P(A \cup B) \ge P(B)$$

The intersection is at most as likely as each event alone: $$P(A \cap B) \le P(A), \quad P(A \cap B) \le P(B)$$


4. Addition Inequality for Two Events

Using the inclusion–exclusion principle: $$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$ Since $P(A \cap B) \ge 0$: $$P(A \cup B) \le P(A) + P(B)$$

Also, since $P(A \cap B) \le 1$, the same identity gives a lower bound, so altogether: $$P(A) + P(B) - 1 \le P(A \cup B) \le P(A) + P(B)$$


5. Inequality for Union of n Events

For events $A_1, A_2, \dots, A_n$: $$P(A_1 \cup A_2 \cup \dots \cup A_n) \le \sum_{i=1}^{n} P(A_i)$$ This is a general form of the union bound.


6. Boole’s Inequality (Union Bound)

For any number of events $A_1, A_2, A_3, \dots$: $$P\left(\bigcup_{i=1}^{\infty} A_i\right) \le \sum_{i=1}^{\infty} P(A_i)$$ This is very useful when the intersections are not known.
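The union bound is easy to check on a finite sample space. Here we deliberately pick overlapping events on a fair die (our own illustrative choice) so the inequality is strict:

```python
from fractions import Fraction

space = set(range(1, 7))            # fair die
events = [{1, 2}, {2, 3}, {3, 4}]   # overlapping events

def P(E):
    return Fraction(len(E & space), len(space))

union = set().union(*events)
lhs = P(union)                      # P(A1 ∪ A2 ∪ A3) = 4/6
rhs = sum(P(E) for E in events)     # Σ P(Ai) = 6/6
print(lhs, rhs)  # 2/3 1
```

The bound $2/3 \le 1$ holds, and the gap comes exactly from the double-counted overlaps.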


7. Bonferroni Inequality (Two Events)

For two events $A$ and $B$: $$P(A) + P(B) - 1 \le P(A \cap B)$$ because $P(A \cup B) \le 1$ and $$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$


8. Markov’s Inequality

If $X$ is a non-negative random variable and $a > 0$, then: $$P(X \ge a) \le \frac{E(X)}{a}$$ This gives an upper bound on the tail probability using only the expectation.
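Markov's inequality can be seen numerically by simulating a non-negative random variable, here a fair die (an illustrative sketch; the sample size is arbitrary):

```python
import random

random.seed(0)
samples = [random.randint(1, 6) for _ in range(100_000)]  # fair die rolls
E_X = sum(samples) / len(samples)                         # ≈ 3.5

a = 5
tail = sum(x >= a for x in samples) / len(samples)        # P(X ≥ 5) ≈ 1/3
bound = E_X / a                                           # Markov bound ≈ 0.7
print(round(tail, 3), "<=", round(bound, 3))
```

The true tail probability ($1/3$) is well below the bound ($0.7$): Markov is valid but often loose, since it uses only the mean.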


9. Chebyshev’s Inequality

If $X$ is a random variable with mean $\mu$ and variance $\sigma^2$, then for any $k > 0$: $$P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2}$$ This inequality shows that most of the probability mass lies within a few standard deviations of the mean.


10. Jensen’s Inequality (Advanced)

Let $X$ be a random variable and $\phi$ be a convex function. Then: $$\phi(E(X)) \le E(\phi(X))$$ This inequality is widely used in advanced probability and statistics, information theory, and optimization.
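Jensen's inequality can be illustrated with the convex function $\phi(x) = x^2$ on a fair die; it then reduces to the familiar fact $[E(X)]^2 \le E(X^2)$, i.e. that variance is non-negative:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die

def phi(x):
    return x * x                                 # convex function

lhs = phi(sum(x * p for x, p in pmf.items()))    # φ(E(X)) = (7/2)² = 49/4
rhs = sum(phi(x) * p for x, p in pmf.items())    # E(φ(X)) = 91/6
print(lhs, "<=", rhs)  # 49/4 <= 91/6
```

Here $49/4 = 12.25 \le 91/6 \approx 15.17$; the gap between the two sides is exactly $\operatorname{Var}(X) = 35/12$.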


11. Quick Summary

  • $0 \le P(A) \le 1$
  • $P(\bar{A}) = 1 - P(A)$
  • $P(A \cup B) \le P(A) + P(B)$
  • $P(A_1 \cup \dots \cup A_n) \le \sum P(A_i)$
  • $P(A) + P(B) - 1 \le P(A \cap B)$
  • Markov: $P(X \ge a) \le \dfrac{E(X)}{a}$
  • Chebyshev: $P(|X - \mu| \ge k\sigma) \le \dfrac{1}{k^2}$
  • Jensen (convex $\phi$): $\phi(E(X)) \le E(\phi(X))$
